RGBD Human Activity Recognition by Multi-Modal Context Fusion

Authors

  • Bingbing Ni
  • Jiashi Feng
  • Pierre Moulin
Abstract

We propose a novel complex activity recognition and localization framework which effectively fuses information from both grayscale and depth image channels at multiple levels of the video processing pipeline. At the individual visual feature detection level, depth-based filters are applied to the detected human/object rectangles to remove false detections. At the next level of interaction modeling, three-dimensional spatial and temporal contexts among human subjects or objects are extracted by integrating information from both grayscale and depth images. Depth information is also utilized to distinguish different types of indoor scenes. Finally, a latent structural model is developed to integrate the information from multiple levels of video processing for activity detection. Extensive experiments on a challenging grayscale + depth human activity database which contains complex human-human, human-object and human-surroundings interactions demonstrate the effectiveness of the proposed multi-level grayscale + depth fusion scheme.
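The depth-based filtering step mentioned in the abstract can be illustrated with a minimal sketch: reject detection rectangles whose depth statistics are implausible for an indoor human or object. The function name, the (x, y, w, h) box convention, and the depth thresholds below are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def filter_detections_by_depth(boxes, depth_map, min_depth=0.5, max_depth=6.0):
    """Keep only detection rectangles whose median depth falls in a
    plausible indoor range (thresholds in meters; values are assumptions).

    boxes:     list of (x, y, w, h) rectangles in pixel coordinates
    depth_map: 2-D array of per-pixel depth values, 0 meaning no reading
    """
    kept = []
    for (x, y, w, h) in boxes:
        patch = depth_map[y:y + h, x:x + w]
        valid = patch[patch > 0]  # discard missing depth readings
        # A box with no valid depth, or a median depth outside the
        # expected range, is treated as a false detection and dropped.
        if valid.size and min_depth <= np.median(valid) <= max_depth:
            kept.append((x, y, w, h))
    return kept
```

In the actual framework the filters operate alongside interaction modeling and scene classification; this sketch only shows the general idea of pruning detections with the depth channel.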


Related Articles

Multi-Modal RGBD Sensors for Object Grasping and Manipulation

RGBD sensors, such as the Microsoft Xbox Kinect [1], are multi-modal perceptual sensors that have appeared in recent years. They have become standard perceptual tools for robots because they provide a unique multi-modal approach to perception. A vital precursor challenge in object grasping and manipulation is object pose recognition: a robot must identify the pose (i.e. orientation...


Weakly-supervised DCNN for RGB-D Object Recognition in Real-World Applications Which Lack Large-scale Annotated Training Data

This paper addresses the problem of RGBD object recognition in real-world applications, where large amounts of annotated training data are typically unavailable. To overcome this problem, we propose a novel, weakly-supervised learning architecture (DCNN-GPC) which combines parametric models (a pair of Deep Convolutional Neural Networks (DCNN) for RGB and D modalities) with non-parametric models...


User Identification and Object Recognition in Clutter Scenes Based on RGB-Depth Analysis

We propose an automatic system for user identification and object recognition based on multi-modal RGB-Depth data analysis. We model an RGBD environment by learning a pixel-based background Gaussian distribution. Then, user and object candidate regions are detected and recognized online using robust statistical approaches over RGBD descriptions. Finally, the system saves the history of user-object...


Hybridization of Facial Features and Use of Multi Modal Information for 3D Face Recognition

Despite achieving good performance in controlled environments, conventional 3D face recognition systems still encounter problems in handling large variations in lighting conditions, facial expression and head pose. Humans use a hybrid approach to recognize faces, and therefore the proposed method incorporates the human face recognition ability by combining global and local ...


On the Improvements of Uni-modal and Bi-modal Fusions of Speaker and Face Recognition for Mobile Biometrics

The MOBIO database provides a challenging test-bed for speaker and face recognition systems because it includes voice and face samples as they would appear in forensic scenarios. In this paper, we investigate uni-modal and bi-modal multi-algorithm fusion using logistic regression. The source speaker and face recognition systems were taken from the 2013 speaker and face recognition evaluations th...



Publication year: 2014